# Macedonian language optimization

## MKLLM-7B-Instruct

MKLLM-7B is an open-source large language model for Macedonian, built by continuing the pre-training of Mistral-7B-v0.1 on a mix of Macedonian and English text.

Tags: Large Language Model, Transformers, Multilingual
Author: trajkovnikola
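
Since the listing tags this model for the Transformers library, a minimal loading-and-generation sketch in Python might look like the following; the repository id `trajkovnikola/MKLLM-7B-Instruct` and the use of a chat template are assumptions based on this listing, not confirmed details.

```python
# Minimal sketch (assumptions noted above): loading MKLLM-7B-Instruct with Transformers.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "trajkovnikola/MKLLM-7B-Instruct"  # assumed repository id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# A short Macedonian prompt; apply_chat_template assumes the model ships a chat template.
messages = [{"role": "user", "content": "Кажи ми нешто за Охрид."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output_ids = model.generate(input_ids, max_new_tokens=200)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```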
## XLMR-BERTovski

A language model pre-trained on large amounts of Bulgarian and Macedonian text, developed as part of the MaCoCu project.

Tags: Large Language Model, Other
Author: MaCoCu
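
The name suggests an XLM-R-style encoder, so a masked-token sketch is shown below; the repository id `MaCoCu/XLMR-BERTovski`, fill-mask support, and the example sentence are assumptions rather than details from the listing.

```python
# Minimal sketch: querying XLMR-BERTovski as a masked-language model.
# The repo id "MaCoCu/XLMR-BERTovski" and XLM-R-style architecture are assumptions.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="MaCoCu/XLMR-BERTovski")

# XLM-R-based tokenizers use "<mask>" as the mask token.
for prediction in fill_mask("Скопје е главен <mask> на Македонија."):
    print(prediction["token_str"], round(prediction["score"], 3))
```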